Convergence Analysis of ADMM for a Family of Nonconvex Problems

Authors

  • Mingyi Hong
  • Zhi-Quan Luo
  • Meisam Razaviyayn
Abstract

In this paper, we analyze the behavior of the well-known alternating direction method of multipliers (ADMM) for solving a family of nonconvex problems. We focus on the consensus and sharing problems, both of which have wide applications in machine learning. We show that, in the presence of a nonconvex objective, the classical ADMM is able to reach the set of stationary solutions for these problems, provided that the stepsize is chosen sufficiently large. An interesting consequence of our analysis is that ADMM is convergent for a family of sharing problems regardless of the number of blocks or the convexity of the objective function. Our analysis can be generalized to allow proximal update rules as well as flexible block selection rules beyond the traditional Gauss-Seidel rule.
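For concreteness, the following is a minimal sketch of consensus ADMM in the scaled-dual form on a toy instance where each local objective f_i is a possibly indefinite (hence nonconvex) quadratic, so every subproblem has a closed form. The quadratic model, the function names, and the penalty value ρ = 10 are illustrative assumptions of this sketch, not details taken from the paper; the requirement that ρ be large enough to make A_i + ρI positive definite mirrors the "stepsize chosen large enough" condition discussed in the abstract.

```python
# Minimal consensus ADMM sketch (scaled dual form), assuming local objectives
#   f_i(x) = 0.5 * x^T A_i x + b_i^T x   with A_i symmetric, possibly indefinite,
# so the x_i-subproblem has the closed form x_i = (A_i + rho*I)^{-1}(rho*(z - u_i) - b_i).
import numpy as np

def consensus_admm(A_list, b_list, rho=10.0, iters=200):
    n, N = b_list[0].size, len(A_list)
    x = [np.zeros(n) for _ in range(N)]   # local copies
    u = [np.zeros(n) for _ in range(N)]   # scaled dual variables
    z = np.zeros(n)                       # consensus variable
    I = np.eye(n)
    for _ in range(iters):
        # local updates: minimize f_i(x_i) + (rho/2)*||x_i - z + u_i||^2
        for i in range(N):
            x[i] = np.linalg.solve(A_list[i] + rho * I, rho * (z - u[i]) - b_list[i])
        # consensus update: average of x_i + u_i
        z = sum(x[i] + u[i] for i in range(N)) / N
        # dual updates
        for i in range(N):
            u[i] = u[i] + x[i] - z
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, N = 5, 3
    A_list = []
    for _ in range(N):
        M = rng.standard_normal((n, n))
        A_list.append(0.5 * (M + M.T))    # symmetric, generally indefinite (nonconvex f_i)
    b_list = [rng.standard_normal(n) for _ in range(N)]
    print("consensus point:", consensus_admm(A_list, b_list))
```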


Related articles

An efficient improvement of the Newton method for solving nonconvex optimization problems

Newton's method is one of the most famous line-search methods for minimizing functions. It is well known that the search direction and the step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of Newton's method for solving unconstrained optimization problems is presented. The significant ...
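As a point of reference, here is a minimal sketch of the classical damped Newton line-search method that such modifications build on; the Hessian regularization, the Armijo backtracking constants, and the Rosenbrock test function are assumptions of this sketch and do not reproduce the paper's modification.

```python
# Classical damped Newton method with backtracking (Armijo) line search.
# The Hessian is regularized when it is not positive definite so the sketch
# also behaves sensibly on nonconvex test functions (an assumption of this sketch).
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        # shift the Hessian if needed so that d is a descent direction
        lam_min = np.linalg.eigvalsh(H).min()
        if lam_min <= 1e-6:
            H = H + (1e-6 - lam_min) * np.eye(x.size)
        d = -np.linalg.solve(H, g)            # Newton search direction
        # backtracking line search to pick the step length
        t, c, beta = 1.0, 1e-4, 0.5
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

if __name__ == "__main__":
    # Rosenbrock function as a standard unconstrained test problem
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                               200*(x[1] - x[0]**2)])
    hess = lambda x: np.array([[2 - 400*(x[1] - 3*x[0]**2), -400*x[0]],
                               [-400*x[0], 200.0]])
    print(damped_newton(f, grad, hess, np.array([-1.2, 1.0])))
```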


Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, φ(x_0, …, x_p, y), subject to coupled linear equality constraints. Our ADMM updates each of the primal variables x_0, …, x_p, y, followed by updating the dual variable. We separate the variable y from the x_i's as it has a spe...
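To make the Gauss-Seidel update order concrete, here is a minimal sketch of a multi-block ADMM on a toy linearly constrained instance with two smooth scalar blocks and one nonsmooth block that is updated last; the toy problem, the scalar block sizes, and the penalty ρ = 1 are illustrative assumptions rather than the setting analyzed in the paper.

```python
# Minimal multi-block (Gauss-Seidel) ADMM sketch on the toy instance
#   minimize 0.5*x0^2 + 0.5*x1^2 + |y|   s.t.   x0 + x1 + y = 3,
# chosen so that every subproblem has a closed form.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def multiblock_admm(rho=1.0, c=3.0, iters=200):
    x0, x1, y, lam = 0.0, 0.0, 0.0, 0.0
    for _ in range(iters):
        # x0-update: argmin 0.5*x^2 + lam*(x - r) + (rho/2)*(x - r)^2, with r = c - x1 - y
        r = c - x1 - y
        x0 = (rho * r - lam) / (1.0 + rho)
        # x1-update (same form, using the already-refreshed x0: Gauss-Seidel order)
        r = c - x0 - y
        x1 = (rho * r - lam) / (1.0 + rho)
        # y-update: argmin |y| + lam*(y - r) + (rho/2)*(y - r)^2  ->  soft-thresholding
        r = c - x0 - x1
        y = soft_threshold(r - lam / rho, 1.0 / rho)
        # dual update on the constraint residual
        lam = lam + rho * (x0 + x1 + y - c)
    return x0, x1, y

if __name__ == "__main__":
    # the blocks should (approximately) satisfy x0 + x1 + y = 3
    print(multiblock_admm())
```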


Convergence rate bounds for a proximal ADMM with over-relaxation stepsize parameter for solving nonconvex linearly constrained problems

This paper establishes convergence rate bounds for a variant of the proximal alternating direction method of multipliers (ADMM) for solving nonconvex linearly constrained optimization problems. The variant of the proximal ADMM allows the inclusion of an over-relaxation stepsize parameter belonging to the interval (0, 2). To the best of our knowledge, all related papers in the literature only co...
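The sketch below shows one standard way a proximal term and an over-relaxation parameter α ∈ (0, 2) enter the ADMM updates, on a toy ℓ1-regularized least-squares splitting; the proximal weight τ, the relaxation rule x̂ = αx + (1 − α)z, and the random test data are assumptions of this sketch and may differ from the exact variant analyzed in the paper.

```python
# Proximal ADMM with an over-relaxation parameter alpha in (0, 2), on the splitting
#   minimize 0.5*||A x - b||^2 + ||z||_1   s.t.   x - z = 0.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_admm_overrelaxed(A, b, rho=1.0, tau=0.5, alpha=1.5, iters=300):
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)   # u is the scaled dual variable
    AtA, Atb = A.T @ A, A.T @ b
    M = AtA + (rho + tau) * np.eye(n)                 # x-update system matrix
    for _ in range(iters):
        # proximal x-update: least-squares term plus (tau/2)*||x - x_prev||^2
        x = np.linalg.solve(M, Atb + rho * (z - u) + tau * x)
        # over-relaxation step, alpha in (0, 2)
        x_hat = alpha * x + (1.0 - alpha) * z
        # z-update: soft-thresholding (prox of the l1 norm)
        z = soft_threshold(x_hat + u, 1.0 / rho)
        # dual update using the relaxed point
        u = u + x_hat - z
    return x, z

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 10))
    b = rng.standard_normal(20)
    x, z = prox_admm_overrelaxed(A, b)
    print("max |x - z|:", np.max(np.abs(x - z)))
```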


Alternating Direction Method of Multipliers for a Class of Nonconvex and Nonsmooth Problems with Applications to Background/Foreground Extraction

In this paper, we study a general optimization model, which covers a large class of existing models for many applications in imaging sciences. To solve the resulting possibly nonconvex, nonsmooth and non-Lipschitz optimization problem, we adapt the alternating direction method of multipliers (ADMM) with a general dual step-size to solve a reformulation that contains three blocks of variables, a...
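To illustrate where a general dual step-size enters the updates, here is a minimal sketch of ADMM with dual step-size γ on the classical convex robust-PCA splitting often used for background/foreground extraction; this is not the paper's nonconvex, non-Lipschitz model or its three-block reformulation, and the choices of μ, ρ, and γ are illustrative assumptions of this sketch.

```python
# ADMM with a general dual step-size gamma on the convex robust-PCA splitting
#   minimize ||L||_* + mu*||S||_1   s.t.   L + S = D.
import numpy as np

def svt(X, t):
    """Singular-value thresholding: prox of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def soft_threshold(X, t):
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def rpca_admm(D, mu=None, rho=1.0, gamma=1.5, iters=200):
    m, n = D.shape
    mu = mu if mu is not None else 1.0 / np.sqrt(max(m, n))
    L, S, Y = np.zeros_like(D), np.zeros_like(D), np.zeros_like(D)   # Y is the dual
    for _ in range(iters):
        # L-update: prox of the nuclear norm
        L = svt(D - S + Y / rho, 1.0 / rho)
        # S-update: prox of the (scaled) l1 norm
        S = soft_threshold(D - L + Y / rho, mu / rho)
        # dual update with a general step-size gamma (gamma = 1 gives plain ADMM)
        Y = Y + gamma * rho * (D - L - S)
    return L, S

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    low_rank = np.outer(rng.standard_normal(30), rng.standard_normal(20))
    sparse = (rng.random((30, 20)) < 0.05) * rng.standard_normal((30, 20)) * 5.0
    L, S = rpca_admm(low_rank + sparse)
    print("constraint residual:", np.linalg.norm(low_rank + sparse - L - S))
```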


Nonconvex Sparse Spectral Clustering by Alternating Direction Method of Multipliers and Its Convergence Analysis

Spectral Clustering (SC) is a widely used data clustering method which first learns a low-dimensional embedding U of the data by computing the eigenvectors of the normalized Laplacian matrix, and then performs k-means on U⊤ to get the final clustering result. The Sparse Spectral Clustering (SSC) method extends SC with a sparse regularization on UU⊤ by using the block-diagonal structure prior of UU⊤...
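For context, here is a minimal sketch of the plain SC baseline described above (similarity graph, normalized Laplacian, eigenvector embedding, k-means); it does not implement the paper's nonconvex SSC model or its ADMM solver, and the Gaussian similarity, the bandwidth σ, and the use of SciPy's kmeans2 are assumptions of this sketch.

```python
# Plain spectral clustering: normalized Laplacian -> eigenvector embedding -> k-means.
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_clustering(X, k, sigma=1.0):
    # Gaussian (RBF) similarity matrix with zero diagonal
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    L = np.eye(len(X)) - (d_inv_sqrt[:, None] * W) * d_inv_sqrt[None, :]
    # embedding U: eigenvectors for the k smallest eigenvalues (eigh sorts ascending)
    _, vecs = np.linalg.eigh(L)
    U = vecs[:, :k]
    # normalize the rows, then cluster the embedded points with k-means
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    _, labels = kmeans2(U, k, minit="++")
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
    print(spectral_clustering(X, 2))
```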




Publication date: 2014